Thinking in Systems
🚀 The Book in 3 Sentences
This book is about understanding how systems work and how systems thinking lets you see the world differently. Every system has a structure of stocks, flows, and feedback loops that governs its behavior. Systems thinking lets you model that structure and anticipate how the system will respond when you try to change it.
🎨 Impressions
It was a good book, precise and understandable. It captures how engineers think; I remember very well how we used to model things the same way in school. The book shows how to apply this kind of mental modelling to many different use cases, which I think is important, so I am pleased about it.
One thing I appreciated was the point that everything we perceive about the world is a model: language, maps, mental pictures, and so on. The models are useful, but only to the extent that they depict the world accurately. While reading The Blind Watchmaker by Richard Dawkins, it struck me that since our models shape our worldview, we struggle to grasp that bats might use echolocation the way we use vision; it is inconsistent with our models. Because we cannot echolocate ourselves, our models cannot fathom how bats experience it.
I became even more interested in applying systems thinking and in modelling systems along the lines the book lays out. I am increasingly convinced that this is a good way of understanding the world, and equally convinced that my knowledge is extremely minuscule, so I need to stay humble to avoid messing up the things I work on too much.
✍️ My Top Quotes
-
If a frog turns right and catches a fly, then turns left and catches a fly, and then turns around backward and catches a fly, the purpose of the frog has to do not with turning left or right or backward but with catching flies. If a government proclaims its interest in protecting the environment but allocates little money or effort toward that goal, environmental protection is not, in fact, the government’s purpose. Purposes are deduced from behavior, not from rhetoric or stated goals.
-
Managers are not confronted with problems that are independent of each other, but with dynamic situations that consist of complex systems of changing problems that interact with each other. I call such situations messes. . . . Managers do not solve problems, they manage messes. —RUSSELL ACKOFF, operations theorist
-
Once we see the relationship between structure and behavior, we can begin to understand how systems work, what makes them produce poor results, and how to shift them into better behavior patterns.
-
The only way to fix a system that is laid out poorly is to rebuild it, if you can. Amory Lovins and his team at Rocky Mountain Institute have done wonders on energy conservation by simply straightening out bent pipes and enlarging ones that are too small. If we did similar energy retrofits on all the buildings in the United States, we could shut down many of our electric power plants.
-
I have yet to see any problem, however complicated, which, when looked at in the right way, did not become still more complicated. —POUL ANDERSON
-
A feedback loop is formed when changes in a stock affect the flows into or out of that same stock. A feedback loop can be quite simple and direct. Think of an interest-bearing savings account in a bank. The total amount of money in the account (the stock) affects how much money comes into the account as interest. That is because the bank has a rule that the account earns a certain percent interest each year. The total dollars of interest paid into the account each year (the flow in) is not a fixed amount, but varies with the size of the total in the account.
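To make that loop concrete for myself, here is a tiny Python sketch (the starting balance and the 5% rate are my own made-up numbers, not the book's): the interest inflow is computed from the stock, and that inflow then grows the stock.

```python
# Toy model of the savings-account loop described above.
# The starting balance and the 5% rate are my own assumptions, not the book's.
balance = 1000.0        # the stock: dollars in the account
interest_rate = 0.05    # the bank's rule: interest earned per year

for year in range(1, 11):
    interest = balance * interest_rate   # the inflow depends on the size of the stock
    balance += interest                  # and the inflow in turn changes the stock
    print(f"year {year:2d}: interest = {interest:7.2f}, balance = {balance:8.2f}")
```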
-
Keeping sub-purposes and overall system purposes in harmony is an essential function of successful systems. I’ll get back to this point later when we come to hierarchies.
-
Changes in function or purpose also can be drastic. What if you keep the players and the rules but change the purpose—from winning to losing, for example? What if the function of a tree were not to survive and reproduce but to capture all the nutrients in the soil and grow to unlimited size? People have imagined many purposes for a university besides disseminating knowledge—making money, indoctrinating people, winning football games. A change in purpose changes a system profoundly, even if every element and interconnection remains the same.
-
Stocks allow inflows and outflows to be decoupled and to be independent and temporarily out of balance with each other.
-
Remember—all system diagrams are simplifications of the real world.
-
Balancing feedback loops are equilibrating or goal-seeking structures in systems and are both sources of stability and sources of resistance to change.
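A minimal goal-seeking sketch using the book's coffee-cup image, with my own numbers: the correcting flow is proportional to the gap between the stock and its goal, so it is strong far from the goal and dies away as the gap closes.

```python
# Goal-seeking sketch: a hot cup of coffee cooling toward room temperature.
# Temperatures and the cooling fraction are my own illustrative numbers.
temperature = 90.0   # stock: coffee temperature in degrees C
room = 20.0          # the goal the balancing loop seeks
cooling = 0.2        # fraction of the gap closed each minute (assumed)

for minute in range(1, 16):
    gap = room - temperature       # discrepancy between the stock and its goal
    temperature += cooling * gap   # the correcting flow shrinks as the gap closes
    print(f"minute {minute:2d}: {temperature:5.1f} C")
```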
-
Reinforcing feedback loops are self-enhancing, leading to exponential growth or to runaway collapses over time. They are found whenever a stock has the capacity to reinforce or reproduce itself.
-
the more machines and factories (collectively called “capital”) you have, the more goods and services (“output”) you can produce. The more output you can produce, the more you can invest in new machines and factories. The more you make, the more capacity you have to make even more. This reinforcing feedback loop is the central engine of growth in an economy.
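A rough sketch of that growth engine; the capital/output ratio, reinvestment share, and depreciation below are all numbers I invented for illustration.

```python
# Capital -> output -> investment -> capital, with parameters I invented.
capital = 100.0
output_per_capital = 0.3      # goods and services produced per unit of capital
reinvested_fraction = 0.25    # share of output turned into new capital
depreciation = 0.05           # share of capital wearing out each year

for year in range(1, 41):
    output = capital * output_per_capital
    capital += reinvested_fraction * output - depreciation * capital
    if year % 10 == 0:
        print(f"year {year:2d}: capital = {capital:7.1f}, output = {output:6.1f}")
```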
-
Because we bump into reinforcing loops so often, it is handy to know this shortcut: The time it takes for an exponentially growing stock to double in size, the “doubling time,” equals approximately 70 divided by the growth rate (expressed as a percentage).
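A quick check of the shortcut; it works because ln 2 is about 0.693, which is roughly 70 when the growth rate is written as a percentage.

```python
import math

# Sanity-checking the rule of 70 against the exact doubling time.
for pct in (1, 2, 5, 7, 10):
    exact = math.log(2) / math.log(1 + pct / 100)   # exact doubling time in years
    rule = 70 / pct                                 # the shortcut from the quote
    print(f"{pct:2d}% growth: exact {exact:5.1f} years, rule of 70 says {rule:5.1f}")
```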
-
Many relationships in systems are nonlinear. Their relative strengths shift in disproportionate amounts as the stocks in the system shift. Nonlinearities in feedback systems produce shifting dominance of loops and many complexities in system behavior.
-
This concept of a limiting factor is simple and widely misunderstood. Agronomists assume, for example, that they know what to put in artificial fertilizer, because they have identified many of the major and minor nutrients in good soil. Are there any essential nutrients they have not identified? How do artificial fertilizers affect soil microbe communities? Do they interfere with, and therefore limit, any other functions of good soil? And what limits the production of artificial fertilizers? At any given time, the input that is most important to a system is the one that is most limiting.
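The limiting-factor idea in a few lines, essentially the classic law of the minimum; the inputs and their levels are made up.

```python
# Law-of-the-minimum sketch: growth is capped by the scarcest input, not the average.
# The inputs and their levels (as fractions of what is needed) are made up.
inputs = {"nitrogen": 0.9, "phosphorus": 0.4, "water": 1.0, "trace nutrients": 0.7}

limiting = min(inputs, key=inputs.get)
print(f"most limiting input: {limiting}")
print(f"growth limited to about {inputs[limiting]:.0%} of potential")
```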
-
Complex systems can evolve from simple systems only if there are stable intermediate forms. The resulting complex forms will naturally be hierarchic. That may explain why hierarchies are so common in the systems nature presents to us. Among all possible complex forms, hierarchies are the only ones that have had the time to evolve. (Paraphrased from Herbert Simon)
-
System structure is the source of system behavior. System behavior reveals itself as a series of events over time.
-
Model utility depends not on whether its driving scenarios are realistic (since no one can know that for sure), but on whether it responds with a realistic pattern of behavior. What is adjusting the inflows and outflows?
-
Systems
- A system is more than the sum of its parts.
- Many of the interconnections in systems operate through the flow of information.
- The least obvious part of the system, its function or purpose, is often the most crucial determinant of the system’s behavior.
- System structure is the source of system behavior. System behavior reveals itself as a series of events over time.

Stocks, Flows, and Dynamic Equilibrium
- A stock is the memory of the history of changing flows within the system.
- If the sum of inflows exceeds the sum of outflows, the stock level will rise.
- If the sum of outflows exceeds the sum of inflows, the stock level will fall.
- If the sum of outflows equals the sum of inflows, the stock level will not change — it will be held in dynamic equilibrium.
- A stock can be increased by decreasing its outflow rate as well as by increasing its inflow rate.
- Stocks act as delays or buffers or shock absorbers in systems.
- Stocks allow inflows and outflows to be de-coupled and independent.

Feedback Loops
- A feedback loop is a closed chain of causal connections from a stock, through a set of decisions or rules or physical laws or actions that are dependent on the level of the stock, and back again through a flow to change the stock.
- Balancing feedback loops are equilibrating or goal-seeking structures in systems and are both sources of stability and sources of resistance to change.
- Reinforcing feedback loops are self-enhancing, leading to exponential growth or to runaway collapses over time.
- The information delivered by a feedback loop—even nonphysical feedback—can affect only future behavior; it can’t deliver a signal fast enough to correct behavior that drove the current feedback.
- A stock-maintaining balancing feedback loop must have its goal set appropriately to compensate for draining or inflowing processes that affect that stock. Otherwise, the feedback process will fall short of or exceed the target for the stock.
- Systems with similar feedback structures produce similar dynamic behaviors.

Shifting Dominance, Delays, and Oscillations
- Complex behaviors of systems often arise as the relative strengths of feedback loops shift, causing first one loop and then another to dominate behavior.
- A delay in a balancing feedback loop makes a system likely to oscillate.
- Changing the length of a delay may make a large change in the behavior of a system.

Scenarios and Testing Models
- System dynamics models explore possible futures and ask “what if” questions.
- Model utility depends not on whether its driving scenarios are realistic (since no one can know that for sure), but on whether it responds with a realistic pattern of behavior.

Constraints on Systems
- In physical, exponentially growing systems, there must be at least one reinforcing loop driving the growth and at least one balancing loop constraining the growth, because no system can grow forever in a finite environment.
- Nonrenewable resources are stock-limited.
- Renewable resources are flow-limited.

Resilience, Self-Organization, and Hierarchy
- There are always limits to resilience.
- Systems need to be managed not only for productivity or stability, they also need to be managed for resilience.
- Systems often have the property of self-organization—the ability to structure themselves, to create new structure, to learn, diversify, and complexify.
- Hierarchical systems evolve from the bottom up. The purpose of the upper layers of the hierarchy is to serve the purposes of the lower layers.

Source of System Surprises
- Many relationships in systems are nonlinear.
- There are no separate systems. The world is a continuum. Where to draw a boundary around a system depends on the purpose of the discussion.
- At any given time, the input that is most important to a system is the one that is most limiting.
- Any physical entity with multiple inputs and outputs is surrounded by layers of limits.
- There always will be limits to growth.
- A quantity growing exponentially toward a limit reaches that limit in a surprisingly short time.
- When there are long delays in feedback loops, some sort of foresight is essential.
- The bounded rationality of each actor in a system may not lead to decisions that further the welfare of the system as a whole.

Mindsets and Models
- Everything we think we know about the world is a model.
- Our models do have a strong congruence with the world.
- Our models fall far short of representing the real world fully.
-
“High leverage, wrong direction,” the systems-thinking car dealer says to herself as she watches this failure of a policy intended to stabilize the oscillations. This perverse kind of result can be seen all the time—someone trying to fix a system is attracted intuitively to a policy lever that in fact does have a strong effect on the system. And then the well-intentioned fixer pulls the lever in the wrong direction! This is just one example of how we can be surprised by the counterintuitive behavior of systems when we start trying to change them.
-
Delays in feedback loops are critical determinants of system behavior. They are common causes of oscillations. If you’re trying to adjust a stock (your store inventory) to meet your goal, but you receive only delayed information about what the state of the stock is, you will overshoot and undershoot your goal.
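A small sketch of that inventory case with numbers I invented: the ordering rule only sees a five-day-old picture of the stock, so it keeps correcting for a gap that has already changed, and the inventory overshoots and undershoots before settling.

```python
# Store inventory steered toward a goal using five-day-old information (all numbers mine).
inventory = 100.0
goal = 200.0
delay = 5                          # days before I actually see the inventory level
believed = [inventory] * delay     # stale readings of the stock
correction = 0.2                   # how aggressively I try to close the perceived gap

for day in range(1, 41):
    perceived = believed[-delay]                   # what I think the inventory is
    inventory += correction * (goal - perceived)   # net effect of ordering on the stock
    believed.append(inventory)
    if day % 5 == 0:
        print(f"day {day:2d}: inventory = {inventory:6.1f}")
```

Shortening the delay or softening the correction damps the oscillation quickly, which matches the summary point that changing the length of a delay can change a system's behavior a lot.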
-
In short, this book is poised on a duality. We know a tremendous amount about how the world works, but not nearly enough. Our knowledge is amazing; our ignorance even more so. We can improve our understanding, but we can’t make it perfect.
-
I’ve shown three sets of possible behaviors of this renewable resource system here: • overshoot and adjustment to a sustainable equilibrium, • overshoot beyond that equilibrium followed by oscillation around it, and • overshoot followed by collapse of the resource and the industry dependent on the resource.
-
Nonrenewable resources are stock-limited. The entire stock is available at once, and can be extracted at any rate (limited mainly by extraction capital). But since the stock is not renewed, the faster the extraction rate, the shorter the lifetime of the resource. Renewable resources are flow-limited. They can support extraction or harvest indefinitely, but only at a finite flow rate equal to their regeneration rate. If they are extracted faster than they regenerate, they may eventually be driven below a critical threshold and become, for all practical purposes, nonrenewable.
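The stock-limited versus flow-limited distinction as simple arithmetic; the reserve size, extraction rates, and regeneration rate below are my own illustrative numbers.

```python
# Stock-limited vs flow-limited, with invented quantities.
reserve = 1_000_000                           # nonrenewable stock, e.g. barrels in the ground
for extraction in (10_000, 20_000, 50_000):   # barrels per year
    print(f"extract {extraction:>6}/yr -> reserve lasts {reserve / extraction:5.0f} years")

regeneration = 5_000                          # renewable flow, e.g. fish regrown per year
for harvest in (3_000, 5_000, 8_000):
    verdict = "sustainable indefinitely" if harvest <= regeneration else "stock gets drawn down"
    print(f"harvest {harvest:>5}/yr vs regeneration {regeneration}/yr -> {verdict}")
```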
-
Resilience has many definitions, depending on the branch of engineering, ecology, or system science doing the defining. For our purposes, the normal dictionary meaning will do: “the ability to bounce or spring back into shape, position, etc., after being pressed or stretched. Elasticity. The ability to recover strength, spirits, good humor, or any other aspect quickly.” Resilience is a measure of a system’s ability to survive and persist within a variable environment. The opposite of resilience is brittleness or rigidity.
-
Evolution appears to be not a series of accidents the course of which is determined only by the change of environments during earth history and the resulting struggle for existence, . . . but is governed by definite laws. . . . The discovery of these laws constitutes one of the most important tasks of the future. —Ludwig von Bertalanffy, biologist
-
When a subsystem’s goals dominate at the expense of the total system’s goals, the resulting behavior is called suboptimization.
-
Resilience, self-organization, and hierarchy are three of the reasons dynamic systems can work so well. Promoting or managing for these properties of a system can improve its ability to function well over the long term—to be sustainable. But watching how systems behave also can be full of surprises. Hierarchical systems evolve from the bottom up. The purpose of the upper layers of the hierarchy is to serve the purposes of the lower layers.
- Everything we think we know about the world is a model. Every word and every language is a model. All maps and statistics, books and databases, equations and computer programs are models. So are the ways I picture the world in my head—my mental models. None of these is or ever will be the real world.
- Our models usually have a strong congruence with the world. That is why we are such a successful species in the biosphere. Especially complex and sophisticated are the mental models we develop from direct, intimate experience of nature, people, and organizations immediately around us.
- However, and conversely, our models fall far short of representing the world fully. That is why we make mistakes and why we are regularly surprised. In our heads, we can keep track of only a few variables at one time. We often draw illogical conclusions from accurate assumptions, or logical conclusions from inaccurate assumptions. Most of us, for instance, are surprised by the amount of growth an exponential process can generate. Few of us can intuit how to damp oscillations in a complex system.
-
Everything we think we know about the world is a model. Our models do have a strong congruence with the world. Our models fall far short of representing the real world fully.
-
Linear relationships are easy to think about: the more the merrier. Linear equations are solvable, which makes them suitable for textbooks. Linear systems have an important modular virtue: you can take them apart and put them together again—the pieces add up. Nonlinear systems generally cannot be solved and cannot be added together. . . . Nonlinearity means that the act of playing the game has a way of changing the rules. . . . That twisted changeability makes nonlinearity hard to calculate, but it also creates rich kinds of behavior that never occur in linear systems. —James Gleick, author of Chaos: Making a New Science
-
There are two antidotes to eroding goals. One is to keep standards absolute, regardless of performance. Another is to make goals sensitive to the best performances of the past, instead of the worst. If perceived performance has an upbeat bias instead of a downbeat one, if one takes the best results as a standard, and the worst results only as a temporary setback, then the same system structure can pull the system up to better and better performance. The reinforcing loop going downward, which said “the worse things get, the worse I’m going to let them get,” becomes a reinforcing loop going upward: “The better things get, the harder I’m going to work to make them even better.” If I had applied that lesson to my jogging, I’d be running marathons by now.
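I tried sketching the two variants in a toy model (entirely my own construction and numbers, not the book's); the only difference between the runs is whether the standard is re-anchored to the worst or the best performance seen so far.

```python
import random

# Eroding goals vs ratcheting goals: same structure, different anchor for the standard.
def run(anchor_to_best, steps=60):
    random.seed(7)                       # same run of luck for both variants
    performance = goal = 50.0
    best = worst = performance
    for _ in range(steps):
        # effort pulls performance toward the current goal; results vary day to day
        performance += 0.5 * (goal - performance) + random.uniform(-3, 3)
        best = max(best, performance)
        worst = min(worst, performance)
        goal = best if anchor_to_best else worst   # where the standard comes from
    return performance

print("goal follows worst past performance:", round(run(anchor_to_best=False), 1))
print("goal follows best past performance: ", round(run(anchor_to_best=True), 1))
```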
-
Rich countries transfer capital or technology to poor ones and wonder why the economies of the receiving countries still don’t develop, never thinking that capital or technology may not be the most limiting factors.
-
We are surprised over and over again at how much time things take. Jay Forrester used to tell us, when we were modeling a construction or processing delay, to ask everyone in the system how long they thought the delay was, make our best guess, and then multiply by three. (That correction factor also works perfectly, I have found, for estimating how long it will take to write a book!)
-
Bounded rationality means that people make quite reasonable decisions based on the information they have. But they don’t have perfect information, especially about more distant parts of the system. Fishermen don’t know how many fish there are, much less how many fish will be caught by other fishermen that same day.
-
Rational elites . . . know everything there is to know about their self-contained technical or scientific worlds, but lack a broader perspective. They range from Marxist cadres to Jesuits, from Harvard MBAs to army staff officers. . . . They have a common underlying concern: how to get their particular system to function. Meanwhile . . . civilization becomes increasingly directionless and incomprehensible. —John Ralston Saul, political scientist
-
THE TRAP: POLICY RESISTANCE When various actors try to pull a system stock toward various goals, the result can be policy resistance. Any new policy, especially if it’s effective, just pulls the stock farther from the goals of other actors and produces additional resistance, with a result that no one likes, but that everyone expends considerable effort in maintaining. THE WAY OUT Let go. Bring in all the actors and use the energy formerly expended on resistance to seek out mutually satisfactory ways for all goals to be realized—
-
Success to the successful is a well-known concept in the field of ecology, where it is called “the competitive exclusion principle.” This principle says that two different species cannot live in exactly the same ecological niche, competing for exactly the same resources. Because the two species are different, one will necessarily reproduce faster, or be able to use the resource more efficiently than the other. It will win a larger share of the resource, which will give it the ability to multiply more and keep winning. It will not only dominate the niche, it will drive the losing competitor to extinction. That will happen not by direct confrontation usually, but by appropriating all the resource, leaving none for the weaker competitor.
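A toy version of that mechanism with parameters I made up: both species capture the shared resource in proportion to their numbers, but one converts it a bit more efficiently, and that edge compounds until the other is effectively gone.

```python
# Two species competing for the same resource flow; A converts it slightly more
# efficiently, so its share compounds while B declines. Numbers are mine.
a, b = 100.0, 100.0
efficiency_a, efficiency_b = 1.2, 1.0   # offspring per unit of resource captured
resource = 100.0                        # resource arriving each generation
mortality = 0.4                         # per-capita deaths per generation

for generation in range(1, 121):
    per_head = resource / (a + b)       # both draw on the same shared flow
    a += a * (efficiency_a * per_head - mortality)
    b += b * (efficiency_b * per_head - mortality)
    if generation % 30 == 0:
        print(f"gen {generation:3d}: A = {a:6.1f}, B = {b:7.2f}")
```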
-
It’s not that parameters aren’t important—they can be, especially in the short term and to the individual who’s standing directly in the flow. People care deeply about such variables as taxes and the minimum wage, and so fight fierce battles over them. But changing these variables rarely changes the behavior of the national economy system.
-
You hear about catastrophic river floods much more often than catastrophic lake floods, because stocks that are big, relative to their flows, are more stable than small ones. In chemistry and other fields, a big, stabilizing stock is known as a buffer.
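A two-line illustration with invented volumes: the same surge barely moves a big stock but is a large fraction of a small one.

```python
# The same surge of inflow hitting a big stock and a small one (volumes are made up).
surge = 100.0
for name, volume in (("lake", 10_000.0), ("river reach", 200.0)):
    print(f"{name}: the surge is {surge / volume:.1%} of its normal volume")
```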
-
Physical structure is crucial in a system, but is rarely a leverage point, because changing it is rarely quick or simple. The leverage point is in proper design in the first place. After the structure is built, the leverage is in understanding its limitations and bottlenecks, using it with maximum efficiency, and refraining from fluctuations or expansions that strain its capacity.
-
Another of Jay Forrester’s famous systems sayings goes: It doesn’t matter how the tax law of a country is written. There is a shared idea in the minds of the society about what a “fair” distribution of the tax load is. Whatever the laws say, by fair means or foul, by complications, cheating, exemptions or deductions, by constant sniping at the rules, actual tax payments will push right up against the accepted idea of “fairness.”
-
Remember, always, that everything you know, and everything everyone knows, is only a model. Get your model out there where it can be viewed. Invite others to challenge your assumptions and add their own. Instead of becoming a champion for one possible explanation or hypothesis or model, collect as many as possible.
-
Through the Freedom of Information Act (from a systems point of view, one of the most important laws in the nation), that information became a matter of public record. In July 1988, the first data on chemical emissions became available. The reported emissions were not illegal, but they didn’t look very good when they were published in local papers by enterprising reporters, who had a tendency to make lists of “the top ten local polluters.” That’s all that happened. There were no lawsuits, no required reductions, no fines, no penalties. But within two years chemical emissions nationwide (at least as reported, and presumably also in fact) had decreased by 40 percent.
-
President Jimmy Carter had an unusual ability to think in feedback terms and to make feedback policies. Unfortunately, he had a hard time explaining them to a press and public that didn’t understand feedback. He suggested, at a time when oil imports were soaring, that there be a tax on gasoline proportional to the fraction of U.S. oil consumption that had to be imported. If imports continued to rise, the tax would rise until it suppressed demand and brought forth substitutes and reduced imports. If imports fell to zero, the tax would fall to zero. The tax never got passed.
-
Resilience: The ability of a system to recover from perturbation; the ability to restore or repair or bounce back after a change due to an outside force.
-
*Guidelines for Living in a World of Systems*
- Get the beat of the system.
- Expose your mental models to the light of day.
- Honor, respect, and distribute information.
- Use language with care and enrich it with systems concepts.
- Pay attention to what is important, not just what is quantifiable.
- Make feedback policies for feedback systems.
- Go for the good of the whole.
- Listen to the wisdom of the system.
- Locate responsibility within the system.
- Stay humble—stay a learner.
- Celebrate complexity.
- Expand time horizons.
- Defy the disciplines.
- Expand the boundary of caring.
- Don’t erode the goal of goodness.